9 research outputs found

    Similarity Learning via Kernel Preserving Embedding

    Full text link
    Data similarity is a key concept in many data-driven applications, and many algorithms are sensitive to the choice of similarity measure. To tackle this fundamental problem, automatic learning of similarity information from data via self-expression has been developed and successfully applied in various models, such as low-rank representation, sparse subspace learning, and semi-supervised learning. However, self-expression merely tries to reconstruct the original data, so valuable information, e.g., the manifold structure, is largely ignored. In this paper, we argue that it is beneficial to preserve the overall relations when we extract similarity information. Specifically, we propose a novel similarity learning framework that minimizes the reconstruction error of kernel matrices, rather than the reconstruction error of the original data adopted by existing work. Taking the clustering task as an example to evaluate our method, we observe considerable improvements over other state-of-the-art methods. More importantly, our proposed framework is very general and provides a novel and fundamental building block for many other similarity-based tasks. Besides, the proposed kernel-preserving scheme opens up many possibilities for embedding high-dimensional data into low-dimensional space. Comment: Published in AAAI 2019
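    A minimal sketch of the kind of kernel-space self-expression the abstract describes, assuming an objective of the form min_Z ||phi(X) - phi(X)Z||_F^2 + lambda ||Z||_F^2, which has the closed-form solution Z = (K + lambda I)^{-1} K for a kernel matrix K; the paper's exact formulation and regularizers may differ. The kernel choice, lambda, and the spectral-clustering step are illustrative assumptions, not the authors' method.

    ```python
    # Sketch: learn a similarity matrix by self-expression in kernel space, then cluster with it.
    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel
    from sklearn.cluster import SpectralClustering

    def kernel_preserving_similarity(X, gamma=1.0, lam=0.1):
        K = rbf_kernel(X, gamma=gamma)                 # kernel matrix K = phi(X)^T phi(X)
        n = K.shape[0]
        Z = np.linalg.solve(K + lam * np.eye(n), K)    # closed-form self-expression coefficients
        return 0.5 * (np.abs(Z) + np.abs(Z.T))         # symmetrized similarity matrix

    # Usage: replace a hand-picked similarity measure with the learned one.
    X = np.random.rand(100, 20)
    S = kernel_preserving_similarity(X)
    labels = SpectralClustering(n_clusters=3, affinity="precomputed").fit_predict(S)
    ```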

    Test Strategy Optimization Based on Soft Sensing and Ensemble Belief Measurement

    No full text
    Owing to short production cycles and rapidly developing design technology, traditional prognostic and health management (PHM) approaches become impractical and fail to meet the requirements of structurally and functionally complex systems. Among all PHM design activities, testability design and maintainability design face critical difficulties. First, testability design requires considerable labor and knowledge preparation, and it wastes the information recorded by sensors. Second, maintainability design is adversely affected by improper testability design. We propose a test strategy optimization based on soft sensing and ensemble belief measurements to overcome these problems. Instead of a serial PHM design, the proposed method constructs a closed loop between testability and maintenance to generate an adaptive fault diagnostic tree with soft-sensor nodes. The generated diagnostic tree ensures high efficiency and flexibility by taking advantage of the extreme learning machine (ELM) and affinity propagation (AP). Experimental results show that our method achieves the best performance among state-of-the-art methods. Additionally, the proposed method increases diagnostic flexibility and saves considerable human labor in testability design.
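    A hedged sketch of the two building blocks the abstract names: affinity propagation (AP) to group test/sensor signatures into candidate diagnostic-tree nodes, and a basic extreme learning machine (ELM) as the classifier at each node. The closed-loop tree construction and soft-sensor optimization of the paper are not reproduced; the signatures, labels, and sizes below are placeholders.

    ```python
    # Sketch: AP clustering of fault signatures + a per-cluster ELM classifier.
    import numpy as np
    from sklearn.cluster import AffinityPropagation

    class ELMClassifier:
        """Basic ELM: random hidden layer, least-squares output weights."""
        def __init__(self, n_hidden=50, seed=0):
            self.n_hidden = n_hidden
            self.rng = np.random.default_rng(seed)

        def fit(self, X, y):
            self.classes_, y_idx = np.unique(y, return_inverse=True)
            T = np.eye(len(self.classes_))[y_idx]            # one-hot targets
            self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
            self.b = self.rng.standard_normal(self.n_hidden)
            H = np.tanh(X @ self.W + self.b)                 # hidden-layer outputs
            self.beta = np.linalg.pinv(H) @ T                # closed-form output weights
            return self

        def predict(self, X):
            H = np.tanh(X @ self.W + self.b)
            return self.classes_[np.argmax(H @ self.beta, axis=1)]

    # Placeholder data: group signatures with AP, then train one ELM per node.
    signatures = np.random.rand(200, 16)
    faults = np.random.randint(0, 4, size=200)
    nodes = AffinityPropagation(random_state=0).fit_predict(signatures)
    node_models = {c: ELMClassifier(64).fit(signatures[nodes == c], faults[nodes == c])
                   for c in np.unique(nodes)}
    ```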

    Evolved-Cooperative Correntropy-Based Extreme Learning Machine for Robust Prediction

    No full text
    In recent years, correntropy has been widely adopted in place of the mean squared error as a powerful tool for enhancing robustness against noise and outliers by forming local similarity measurements. However, most correntropy-based models either describe the correntropy too simply or require too many parameters to be adjusted in advance, which is likely to cause poor performance because the correntropy then fails to reflect the probability distributions of the signals. Therefore, in this paper, a novel correntropy-based extreme learning machine (ELM) called ECC-ELM is proposed to provide a more robust training strategy based on a newly developed multi-kernel correntropy whose parameters are generated using cooperative evolution. To achieve an accurate description of the correntropy, the method adopts a cooperative evolution that optimizes the bandwidths by switching delayed particle swarm optimization (SDPSO) and generates the corresponding influence coefficients that minimize the minimum integrated error (MIE), adaptively providing the best solution. Simulated experiments and real-world applications show that the cooperative evolution reaches an optimal solution that accurately describes the probability distribution of the current error in the model. Therefore, the multi-kernel correntropy built with this solution yields more robustness against noise and outliers during training, which increases prediction accuracy compared with other methods.
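    A hedged sketch of correntropy-weighted ELM training with a fixed multi-kernel correntropy (a mixture of Gaussian kernels over the residuals), solved by iteratively reweighted least squares. In the paper the bandwidths and mixture coefficients are tuned by SDPSO-based cooperative evolution; here they are fixed constants, and the function and parameter names are illustrative assumptions.

    ```python
    # Sketch: ELM output weights refit under multi-kernel correntropy-induced sample weights.
    import numpy as np

    def ecc_elm_fit(X, y, n_hidden=50, sigmas=(0.5, 1.0, 2.0), alphas=(1/3, 1/3, 1/3),
                    lam=1e-3, n_iter=20, seed=0):
        rng = np.random.default_rng(seed)
        W = rng.standard_normal((X.shape[1], n_hidden))
        b = rng.standard_normal(n_hidden)
        H = np.tanh(X @ W + b)                        # hidden-layer outputs
        beta = np.linalg.pinv(H) @ y                  # ordinary least-squares ELM as a start
        for _ in range(n_iter):
            e = y - H @ beta                          # current residuals
            # sample weights induced by the mixture-of-Gaussians (multi-kernel) correntropy
            w = sum(a / s**2 * np.exp(-e**2 / (2 * s**2)) for a, s in zip(alphas, sigmas))
            Hw = H * w[:, None]
            beta = np.linalg.solve(H.T @ Hw + lam * np.eye(n_hidden), Hw.T @ y)
        return W, b, beta

    def ecc_elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta
    ```

    The reweighting step downplays samples with large residuals, which is where the robustness against outliers comes from; the fixed sigmas stand in for the evolved bandwidths described in the abstract.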